
    Computer supported estimation of input data for transportation models

    Control and management of transportation systems frequently rely on optimization or simulation methods based on a suitable model. Such a model combines optimization or simulation procedures with correct input data. The input data define the transportation infrastructure and the transportation flows. Data acquisition is a costly process, so an efficient approach is highly desirable. The infrastructure can be recognized from drawn maps using segmentation, thinning and vectorization; the accurate definition of network topology and node positions is the crucial part of this process. Transportation flows can be analyzed as vehicle behavior based on video sequences of typical traffic situations. The resulting information consists of vehicle position, actual speed and acceleration along the road section. Data for individual vehicles are statistically processed, and standard vehicle characteristics can be recommended for the vehicle generator in simulation models.
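    The per-vehicle speed and acceleration described above can be recovered from frame-by-frame positions with simple finite differences. The sketch below is illustrative only; the positions, frame rate and function name are assumptions, not the authors' code.

    ```python
    def kinematics(positions, fps):
        """Finite-difference speed (m/s) and acceleration (m/s^2) along a road section.

        positions: list of (x, y) coordinates in metres, one per video frame.
        """
        dt = 1.0 / fps
        speeds = []
        for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
            dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
            speeds.append(dist / dt)  # displacement per frame interval
        accels = [(v1 - v0) / dt for v0, v1 in zip(speeds, speeds[1:])]
        return speeds, accels

    # A vehicle accelerating along a straight section, sampled at 1 frame/s.
    speeds, accels = kinematics([(0, 0), (1, 0), (3, 0), (6, 0)], fps=1)
    # speeds = [1.0, 2.0, 3.0], accels = [1.0, 1.0]
    ```

    The per-vehicle series produced this way can then be aggregated statistically, as the abstract describes, to parameterize a vehicle generator.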

    AUTOMATIC VECTORIZATION OF INPUT DATA FOR MODELS OF TRANSPORTATION SYSTEMS

    Digital (vector) data is essential for modeling transportation systems and solving optimization problems. In practice, this input data is often represented by raster maps and drawings, which must be vectorized. Manual and semi-automatic vectorization is expensive and time-consuming, which opens room for automating the process. Many tools now exist for recognizing objects and patterns in raster images, a problem belonging to digital image processing. Although vectorization of maps with transportation infrastructure is a complex problem in general, in the case of drawn maps it is possible to define quite precisely the main features and requirements that the automatic vectorization process should satisfy. This paper presents the process of automatic vectorization of maps with transportation infrastructure.
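    The last step of such a pipeline, after segmentation and thinning have reduced map strokes to one-pixel-wide lines, is tracing each stroke into a polyline. A minimal sketch of that tracing step, on an assumed binary grid (the grid, start point and function name are illustrative, not the paper's algorithm):

    ```python
    def trace_polyline(grid, start):
        """Greedily walk a thinned (1-px-wide) stroke from `start`,
        returning the visited pixel coordinates in order."""
        path, current, seen = [start], start, {start}
        while True:
            r, c = current
            nxt = None
            for dr in (-1, 0, 1):           # scan the 8-neighbourhood
                for dc in (-1, 0, 1):
                    cand = (r + dr, c + dc)
                    if cand in seen:
                        continue
                    rr, cc = cand
                    if 0 <= rr < len(grid) and 0 <= cc < len(grid[0]) and grid[rr][cc]:
                        nxt = cand
            if nxt is None:                 # stroke end reached
                return path
            seen.add(nxt)
            path.append(nxt)
            current = nxt

    # A short horizontal stroke becomes the polyline of its pixel coordinates.
    raster = [
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0],
    ]
    # trace_polyline(raster, (1, 1)) → [(1, 1), (1, 2), (1, 3)]
    ```

    A real vectorizer would additionally detect junctions (pixels with more than two stroke neighbours) to build the network topology the abstract emphasizes.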

    The NORMAN Association and the European Partnership for Chemicals Risk Assessment (PARC): let’s cooperate! [Commentary]

    The Partnership for Chemicals Risk Assessment (PARC) is currently under development as a joint research and innovation programme to strengthen the scientific basis for chemical risk assessment in the EU. The plan is to bring chemical risk assessors and managers together with scientists to accelerate method development and the production of necessary data and knowledge, and to facilitate the transition to next-generation evidence-based risk assessment, a non-toxic environment and the European Green Deal. The NORMAN Network is an independent, well-established and competent network of more than 80 organisations in the field of emerging substances and has enormous potential to contribute to the implementation of the PARC partnership. NORMAN stands ready to provide expert advice to PARC, drawing on its long experience in the development, harmonisation and testing of advanced tools in relation to chemicals of emerging concern and in support of a European Early Warning System to unravel the risks of contaminants of emerging concern (CECs) and close the gap between research and innovation and regulatory processes. In this commentary we highlight the tools developed by NORMAN that we consider most relevant to supporting the PARC initiative: (i) a joint data space and cutting-edge research tools for risk assessment of contaminants of emerging concern; (ii) a collaborative European framework to improve data quality and comparability; (iii) advanced data analysis tools for a European early warning system; and (iv) support for national and European chemical risk assessment by harnessing, combining and sharing evidence and expertise on CECs. By combining the extensive knowledge and experience of the NORMAN network with the financial and policy-related strengths of the PARC initiative, a large step towards the goal of a non-toxic environment can be taken.

    Tensor Implementation of Monte-Carlo Tree Search for Model-Based Reinforcement Learning

    Monte-Carlo tree search (MCTS) is a widely used heuristic search algorithm. In model-based reinforcement learning, MCTS is often utilized to improve the action selection process. However, model-based reinforcement learning methods need to process a large number of observations during training; if MCTS is involved, one instance of MCTS must be run for each observation in every training iteration, so an efficient method for processing multiple MCTS instances is needed. We propose an MCTS implementation that can process a batch of observations in fully parallel fashion on a single GPU using tensor operations. We demonstrate the efficiency of the proposed approach on the MuZero reinforcement learning algorithm. Empirical results show that our method outperforms other approaches and scales well with an increasing number of observations and simulations.
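    The core idea, replacing a per-tree Python loop with one tensor operation over the whole batch, can be illustrated on the UCT child-selection rule. The shapes, constant and function name below are assumptions for illustration; the paper's MuZero-style implementation covers full tree expansion and backup, not just selection.

    ```python
    import numpy as np

    def batched_uct_select(q, visits, c=1.25):
        """Pick one child per tree via a UCT-style score, vectorised over the batch.

        q:      (batch, children) array of mean action values
        visits: (batch, children) array of child visit counts
        """
        total = visits.sum(axis=1, keepdims=True)            # parent visit count per tree
        ucb = q + c * np.sqrt(np.log(total + 1) / (visits + 1))
        return ucb.argmax(axis=1)                            # best child index per tree

    # Two trees, two children each: all selections happen in one tensor op.
    q = np.array([[0.5, 0.2], [0.1, 0.9]])
    visits = np.array([[10, 1], [5, 5]])
    batched_uct_select(q, visits)  # one selected child index per batch element
    ```

    On a GPU the same expression, written in a tensor framework, evaluates selection for thousands of MCTS instances simultaneously, which is what makes the batched formulation pay off.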

    Deep Neural Networks Classification via Binary Error-Detecting Output Codes

    One-hot encoding is the prevalent method used in neural networks to represent multi-class categorical data. Its success stems from its ease of use and its interpretability as a probability distribution when accompanied by a softmax activation function. However, one-hot encoding leads to very high-dimensional vector representations when the categorical data's cardinality is high. From the coding-theory perspective, the Hamming distance of one-hot encoding is equal to two, which provides no error-correcting capability. Binary coding offers more possibilities for encoding categorical data into output codes, which mitigates the limitations of one-hot encoding mentioned above. We propose a novel method based on Zadeh fuzzy logic to train binary output codes holistically. We study linear block codes for their ability to separate class information from the checksum part of the codeword, showing that they can not only detect recognition errors by calculating a non-zero syndrome, but also evaluate the truth-value of the decision. Experimental results show that the proposed approach achieves results similar to one-hot encoding with a softmax function in terms of accuracy, reliability, and out-of-distribution performance. This suggests a good foundation for future applications, mainly classification tasks with a high number of classes.
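    The syndrome-based detection mentioned above works as follows for any linear block code: multiplying an output word by the parity-check matrix H (mod 2) yields zero for valid codewords and a non-zero syndrome otherwise. The sketch uses the standard Hamming(7,4) code as a stand-in; it is not the code the paper trains.

    ```python
    import numpy as np

    # Parity-check matrix of the Hamming(7,4) code (a standard textbook example).
    H = np.array([
        [1, 0, 1, 0, 1, 0, 1],
        [0, 1, 1, 0, 0, 1, 1],
        [0, 0, 0, 1, 1, 1, 1],
    ])

    def syndrome(word):
        """Zero syndrome: `word` is a valid codeword; non-zero: an error is detected."""
        return (H @ np.asarray(word)) % 2

    valid = [0, 0, 0, 0, 0, 0, 0]     # the all-zero codeword is always valid
    flipped = [0, 0, 0, 0, 1, 0, 0]   # the same word with one bit corrupted
    syndrome(valid)    # → [0 0 0], accepted
    syndrome(flipped)  # → [1 0 1], recognition error flagged
    ```

    In the classifier setting, a hard-thresholded network output playing the role of `word` can thus be checked for consistency before the class decision is trusted.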

    Curated Dataset for Red Blood Cell Tracking from Video Sequences of Flow in Microfluidic Devices

    This work presents a dataset comprising images, annotations, and velocity fields for benchmarking cell detection and cell tracking algorithms. The dataset includes two video sequences captured during laboratory experiments, showing the flow of red blood cells (RBCs) in microfluidic channels. 300 frames from the first video and 150 frames from the second are annotated with bounding boxes around the cells, as well as tracks depicting the movement of individual cells throughout the video. The dataset encompasses approximately 20,000 bounding boxes and 350 tracks. Additionally, computational fluid dynamics simulations were used to generate 2D velocity fields representing the flow within the channels; these velocity fields are included in the dataset. The velocity field has been employed to improve cell tracking by predicting the positions of cells across frames. The paper also provides a comprehensive discussion of the use of the flow matrix in the tracking steps.
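    The prediction step described above can be sketched in a few lines: advance each cell by the local flow velocity, then match detections in the next frame to the nearest prediction. The field values, grid resolution and function names are illustrative assumptions, not the dataset's actual tracking code.

    ```python
    import math

    def predict_position(pos, velocity_field, dt=1.0):
        """Advance a cell position by the flow velocity at its grid cell."""
        x, y = pos
        vx, vy = velocity_field[int(y)][int(x)]   # field indexed as [row][col]
        return (x + vx * dt, y + vy * dt)

    def match(prediction, detections):
        """Assign the next-frame detection nearest to the predicted position."""
        return min(detections, key=lambda d: math.dist(prediction, d))

    # Uniform rightward flow of 2 px/frame on a small 2x4 grid.
    field = [[(2.0, 0.0)] * 4 for _ in range(2)]
    pred = predict_position((1.0, 1.0), field)    # (3.0, 1.0)
    match(pred, [(0.5, 1.0), (3.2, 0.9)])         # → (3.2, 0.9)
    ```

    Using the flow to generate the prediction, rather than assuming cells stay put between frames, is what lets the tracker keep identities through the fast, channel-aligned motion of RBCs.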

    In Situ Spectroelectrochemistry of Poly(N
